 rectal cancer


Interpretable Prediction of Lymph Node Metastasis in Rectal Cancer MRI Using Variational Autoencoders

Keel, Benjamin, Quyn, Aaron, Jayne, David, Mohsin, Maryam, Relton, Samuel D.

arXiv.org Artificial Intelligence

Effective treatment for rectal cancer relies on accurate lymph node metastasis (LNM) staging. However, radiological criteria based on lymph node (LN) size, shape and texture morphology have limited diagnostic accuracy. In this work, we investigate applying a Variational Autoencoder (VAE) as a feature encoder model to replace the large pre-trained Convolutional Neural Network (CNN) used in existing approaches. The motivation for using a VAE is that the generative model aims to reconstruct the images, so it directly encodes visual features and meaningful patterns across the data. This leads to a disentangled and structured latent space which can be more interpretable than a CNN. Models are deployed on an in-house MRI dataset with 168 patients who did not undergo neo-adjuvant treatment. The post-operative pathological N stage was used as the ground truth to evaluate model predictions. Our proposed model 'VAE-MLP' achieved state-of-the-art performance on the MRI dataset, with cross-validated metrics of AUC 0.86 +/- 0.05, Sensitivity 0.79 +/- 0.06, and Specificity 0.85 +/- 0.05. Code is available at: https://github.com/benkeel/Lymph_Node_Classification_MIUA.
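The VAE-as-feature-encoder pipeline described in this abstract can be sketched in miniature. This is an illustrative sketch with random stand-in weights, not the authors' implementation: in the paper the encoder is learned on MRI lymph-node patches and the classifier is the "MLP" of VAE-MLP, while every dimension and weight below is hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: flattened lymph-node patch -> latent space.
D_IN, D_LATENT = 64, 8

# Stand-in encoder weights (in the paper these are learned by the VAE).
W_mu = rng.normal(scale=0.1, size=(D_IN, D_LATENT))
W_logvar = rng.normal(scale=0.1, size=(D_IN, D_LATENT))

def encode(x):
    """Map an image vector to a latent mean and log-variance."""
    return x @ W_mu, x @ W_logvar

def reparameterize(mu, logvar):
    """Sample z = mu + sigma * eps, so gradients flow through mu/logvar."""
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * logvar) * eps

def mlp_classify(z, W1, b1, W2, b2):
    """Tiny MLP head: latent vector -> metastasis probability."""
    h = np.maximum(0.0, z @ W1 + b1)      # ReLU hidden layer
    logit = h @ W2 + b2
    return 1.0 / (1.0 + np.exp(-logit))   # sigmoid

# Toy forward pass for one lymph-node patch.
x = rng.normal(size=D_IN)
mu, logvar = encode(x)
z = reparameterize(mu, logvar)
W1, b1 = rng.normal(scale=0.1, size=(D_LATENT, 16)), np.zeros(16)
W2, b2 = rng.normal(scale=0.1, size=16), 0.0
p = mlp_classify(z, W1, b1, W2, b2)
print(round(float(p), 3))  # a probability in (0, 1)
```

The reparameterization step is what makes the latent space both sampleable and trainable by gradient descent; the classifier then reads only the compact latent vector z rather than raw pixels, which is the source of the interpretability argument above.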


LRMR: LLM-Driven Relational Multi-node Ranking for Lymph Node Metastasis Assessment in Rectal Cancer

Dong, Yaoxian, Gao, Yifan, Li, Haoyue, Cui, Yanfen, Gao, Xin

arXiv.org Artificial Intelligence

Accurate preoperative assessment of lymph node (LN) metastasis in rectal cancer guides treatment decisions, yet conventional MRI evaluation based on morphological criteria shows limited diagnostic performance. While some artificial intelligence models have been developed, they often operate as black boxes, lacking the interpretability needed for clinical trust. Moreover, these models typically evaluate nodes in isolation, overlooking the patient-level context. To address these limitations, we introduce LRMR, an LLM-Driven Relational Multi-node Ranking framework. This approach reframes the diagnostic task from a direct classification problem into a structured reasoning and ranking process. The LRMR framework operates in two stages. First, a multimodal large language model (LLM) analyzes a composite montage image of all LNs from a patient, generating a structured report that details ten distinct radiological features. Second, a text-based LLM performs pairwise comparisons of these reports between different patients, establishing a relative risk ranking based on the severity and number of adverse features. We evaluated our method on a retrospective cohort of 117 rectal cancer patients. LRMR achieved an area under the curve (AUC) of 0.7917 and an F1-score of 0.7200, outperforming a range of deep learning baselines, including ResNet50 (AUC 0.7708). Ablation studies confirmed the value of our two main contributions: removing the relational ranking stage or the structured prompting stage led to a significant performance drop, with AUCs falling to 0.6875 and 0.6458, respectively. Our work demonstrates that decoupling visual perception from cognitive reasoning through a two-stage LLM framework offers a powerful, interpretable, and effective new paradigm for assessing lymph node metastasis in rectal cancer.
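The second-stage relational ranking can be illustrated with a toy pairwise tournament. This is a hedged sketch, not the paper's code: the stand-in `compare` function substitutes a simple adverse-feature count for the text LLM's pairwise judgment, and the patient IDs and counts are invented.

```python
from itertools import combinations

# Hypothetical structured reports: patient id -> number of adverse
# radiological features flagged by the first-stage (image) LLM.
reports = {"P1": 2, "P2": 7, "P3": 4, "P4": 0}

def compare(a, b):
    """Stand-in for the second-stage text LLM: return the higher-risk
    patient of a pair, here decided by adverse-feature count."""
    return a if reports[a] >= reports[b] else b

# Pairwise tournament: each win raises a patient's relative risk score.
wins = {p: 0 for p in reports}
for a, b in combinations(reports, 2):
    wins[compare(a, b)] += 1

ranking = sorted(reports, key=lambda p: wins[p], reverse=True)
print(ranking)  # highest-risk first: ['P2', 'P3', 'P1', 'P4']
```

Framing the task as relative ranking rather than per-node classification is the core design choice: each decision is a comparison between two structured reports, which is both easier to audit and, per the ablation above, where most of the performance comes from.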


DRIMV_TSK: An Interpretable Surgical Evaluation Model for Incomplete Multi-View Rectal Cancer Data

Zhang, Wei, Wang, Zi, Zhou, Hanwen, Deng, Zhaohong, Ding, Weiping, Ge, Yuxi, Zhang, Te, Zhang, Yuanpeng, Choi, Kup-Sze, Wang, Shitong, Hu, Shudong

arXiv.org Artificial Intelligence

A reliable evaluation of surgical difficulty can improve treatment outcomes for rectal cancer, yet current evaluation methods rely on clinical data alone. As imaging and data-collection technology develops, richer data about rectal cancer can be gathered, and advances in artificial intelligence make its application to rectal cancer treatment feasible. In this paper, a multi-view rectal cancer dataset is first constructed to give a more comprehensive view of patients, comprising a high-resolution MRI image view, a pressed-fat MRI image view, and a clinical data view. Then, an interpretable incomplete multi-view surgical evaluation model is proposed, since extensive and complete patient data are hard to obtain in real application scenarios. Specifically, a dual-representation incomplete multi-view learning model is first proposed to extract the information common across views and the information specific to each view. In this model, missing-view imputation is integrated into representation learning, and a second-order similarity constraint is introduced to improve the cooperative learning between these two parts. Then, based on the imputed multi-view data and the learned dual representation, a multi-view surgical evaluation model built on the TSK fuzzy system is proposed. In the proposed model, a cooperative learning mechanism explores the information consistent across views, and Shannon entropy is introduced to adapt the view weights. On the MVRC dataset, DRIMV_TSK was compared with several advanced algorithms and obtained the best results.
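The entropy-based view weighting mentioned in this abstract can be sketched as follows. This is an illustrative guess at the mechanism, not the paper's implementation: the view names and probability vectors are hypothetical, and weights are simply taken inversely proportional to each view's prediction entropy so that more confident views count more in the fused decision.

```python
import numpy as np

# Hypothetical per-view class-probability outputs for one patient.
views = {
    "hires_mri": np.array([0.9, 0.1]),   # confident view
    "fat_mri":   np.array([0.6, 0.4]),
    "clinical":  np.array([0.5, 0.5]),   # uninformative view
}

def shannon_entropy(p):
    """Shannon entropy of a discrete distribution (natural log)."""
    p = np.clip(p, 1e-12, 1.0)
    return float(-(p * np.log(p)).sum())

# Lower entropy (a more confident view) -> higher weight; then normalize.
raw = {v: 1.0 / (shannon_entropy(p) + 1e-6) for v, p in views.items()}
total = sum(raw.values())
weights = {v: w / total for v, w in raw.items()}

# Entropy-weighted fusion of the per-view predictions.
fused = sum(weights[v] * views[v] for v in views)
print({v: round(w, 3) for v, w in weights.items()})
```

Under this scheme the uninformative clinical view (uniform probabilities, maximum entropy) automatically receives the smallest weight, which is the adaptive behavior the abstract attributes to the entropy term.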


Meply: A Large-scale Dataset and Baseline Evaluations for Metastatic Perirectal Lymph Node Detection and Segmentation

Guo, Weidong, Zhang, Hantao, Wan, Shouhong, Zou, Bingbing, Wang, Wanqin, Qiu, Chenyang, Li, Jun, Jin, Peiquan

arXiv.org Artificial Intelligence

Accurate segmentation of metastatic lymph nodes in rectal cancer is crucial for the staging and treatment of rectal cancer. However, existing segmentation approaches face challenges due to the absence of pixel-level annotated datasets tailored to lymph nodes around the rectum. Additionally, metastatic lymph nodes are characterized by their relatively small size, irregular shapes, and lower contrast compared to the background, further complicating the segmentation task. To address these challenges, we present the first large-scale perirectal metastatic lymph node CT image dataset, called Meply, which encompasses pixel-level annotations of 269 patients diagnosed with rectal cancer. Furthermore, we introduce a novel lymph-node segmentation model named CoSAM. CoSAM utilizes sequence-based detection to guide the segmentation of metastatic lymph nodes in rectal cancer, contributing to improved localization performance for the segmentation model. It comprises three key components: a sequence-based detection module, a segmentation module, and a collaborative convergence unit. To evaluate the effectiveness of CoSAM, we systematically compare its performance with several popular segmentation methods using the Meply dataset. Our code and dataset will be publicly available at: https://github.com/kanydao/CoSAM.
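The idea of detection guiding segmentation can be shown in a toy form. This is a simplified illustration, not CoSAM itself: the detection box is fixed by hand (in CoSAM it comes from the sequence-based detection module), and the "guidance" is reduced to gating the segmentation probabilities with the box.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical segmentation probability map (8x8) for one CT slice.
seg_prob = rng.random((8, 8))

# Hypothetical detection output: a box believed to contain the node,
# given as (row0, row1, col0, col1) with half-open bounds.
box = (2, 6, 3, 7)

def gate_by_detection(prob, box):
    """Suppress segmentation probabilities outside the detected box —
    a minimal form of detection-guided localization."""
    gated = np.zeros_like(prob)
    r0, r1, c0, c1 = box
    gated[r0:r1, c0:c1] = prob[r0:r1, c0:c1]
    return gated

gated = gate_by_detection(seg_prob, box)
print(float(gated[:2].sum()))  # 0.0: nothing above the box survives
```

Gating like this directly attacks the failure mode named in the abstract: small, low-contrast nodes produce scattered false positives that a localization prior can suppress.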


Workspace Analysis for Laparoscopic Rectal Surgery : A Preliminary Study

Thomieres, Alexandra, Khanzode, Dhruva, Duchalais, Emilie, Jha, Ranjan, Chablat, Damien

arXiv.org Artificial Intelligence

The integration of medical imaging, computational analysis, and robotic technology has brought about a significant transformation in minimally invasive surgical procedures, particularly in the realm of laparoscopic rectal surgery (LRS). This specialized surgical technique, aimed at addressing rectal cancer, requires an in-depth comprehension of the spatial dynamics within the narrow space of the pelvis. Leveraging Magnetic Resonance Imaging (MRI) scans as a foundational dataset, this study incorporates them into Computer-Aided Design (CAD) software to generate precise three-dimensional (3D) reconstructions of the patient's anatomy. At the core of this research is the analysis of the surgical workspace, a critical aspect in the optimization of robotic interventions. Sophisticated computational algorithms process MRI data within the CAD environment, meticulously calculating the dimensions and contours of the pelvic internal regions. The outcome is a nuanced understanding of both viable and restricted zones during LRS, taking into account factors such as curvature, diameter variations, and potential obstacles. This paper delves deeply into the complexities of workspace analysis for robotic LRS, illustrating the seamless collaboration between medical imaging, CAD software, and surgical robotics. Through this interdisciplinary approach, the study aims to surpass traditional surgical methodologies, offering novel insights for a paradigm shift in optimizing robotic interventions within the complex environment of the pelvis.


Image Synthesis-based Late Stage Cancer Augmentation and Semi-Supervised Segmentation for MRI Rectal Cancer Staging

Sasuga, Saeko, Kudo, Akira, Kitamura, Yoshiro, Iizuka, Satoshi, Simo-Serra, Edgar, Hamabe, Atsushi, Ishii, Masayuki, Takemasa, Ichiro

arXiv.org Artificial Intelligence

Rectal cancer is one of the most common diseases and a major cause of mortality. T-staging is important for deciding rectal cancer treatment plans, but evaluating the stage from preoperative MRI images requires considerable radiologist skill and experience. The aim of this study is therefore to segment the mesorectum, rectum, and rectal cancer region so that T-stage can be predicted from the segmentation results. In general, the shortage of large, diverse datasets and of high-quality annotation is a known bottleneck in computer-aided diagnostics development. For rectal cancer, advanced cancer images are very rare, and per-pixel annotation demands substantial radiologist skill and time, so it is not feasible to collect comprehensive disease patterns in a training dataset. To tackle this, we propose two approaches: image synthesis-based late-stage cancer augmentation, and semi-supervised learning designed for T-stage prediction. In the image synthesis data augmentation approach, we generate advanced cancer images from labels; real cancer labels are deformed to resemble advanced cancer labels by an artificial cancer-progression simulation. Next, we introduce a T-staging loss which enables training segmentation models from per-image T-stage labels. The loss keeps the inclusion/invasion relationships between the rectum and the cancer region consistent with the ground-truth T-stage. Verification tests show that the proposed method obtains the best sensitivity (0.76) and specificity (0.80) in distinguishing between stages over T3 and under T2. In the ablation studies, our semi-supervised learning approach with the T-staging loss improved specificity by 0.13, and adding the image synthesis-based data augmentation improved the Dice score of the invasive cancer area by 0.08 over baseline.
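The consistency idea behind the T-staging loss can be shown on toy masks. This is a hypothetical, simplified penalty, not the paper's loss: 1-D binary "masks" stand in for segmentations, and a hard 0/1 penalty stands in for whatever differentiable formulation the authors use. The rule sketched is the one the abstract states: stage >=T3 implies invasion beyond the rectum, <=T2 implies confinement.

```python
import numpy as np

# Toy 1-D "masks": 1 inside the structure, 0 outside.
rectum    = np.array([0, 1, 1, 1, 1, 0, 0])
cancer_t2 = np.array([0, 0, 1, 1, 0, 0, 0])  # confined to the rectum
cancer_t3 = np.array([0, 0, 1, 1, 1, 1, 0])  # invades beyond the rectum

def invades_beyond(cancer, organ):
    """True if any cancer voxel lies outside the organ mask."""
    return bool(np.any(cancer * (1 - organ)))

def t_staging_penalty(cancer, organ, stage):
    """Hypothetical image-level penalty: 1.0 when the segmentation's
    inclusion/invasion relation disagrees with the known T stage."""
    invasion = invades_beyond(cancer, organ)
    wants_invasion = stage >= 3
    return 0.0 if invasion == wants_invasion else 1.0

print(t_staging_penalty(cancer_t2, rectum, 2))  # 0.0: consistent
print(t_staging_penalty(cancer_t3, rectum, 2))  # 1.0: inconsistent
```

The point of such a term is that it supervises segmentation with only a per-image stage label, which is exactly what makes the semi-supervised setting above possible when per-pixel annotation is scarce.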


Automation of Radiation Treatment Planning for Rectal Cancer

Huang, Kai, Das, Prajnan, Olanrewaju, Adenike M., Cardenas, Carlos, Fuentes, David, Zhang, Lifei, Hancock, Donald, Simonds, Hannah, Rhee, Dong Joo, Beddar, Sam, Briere, Tina Marie, Court, Laurence

arXiv.org Artificial Intelligence

To develop an automated workflow for rectal cancer three-dimensional conformal radiotherapy treatment planning that combines deep-learning (DL) aperture predictions and forward-planning algorithms, we designed an algorithm to automate the clinical workflow for field-in-field planning. DL models were trained, validated, and tested on 555 patients to automatically generate aperture shapes for primary and boost fields. Network inputs were digitally reconstructed radiographs, the gross tumor volume (GTV), and the nodal GTV. A physician scored each aperture for 20 patients on a 5-point scale (>3 acceptable). A planning algorithm was then developed to create a homogeneous dose using a combination of wedges and subfields. The algorithm iteratively identifies a hotspot volume, creates a subfield, and optimizes beam weights, all without user intervention. The algorithm was tested on 20 patients using clinical apertures with different settings, and the resulting plans (4 plans/patient) were scored by a physician. The end-to-end workflow was then tested and scored by a physician on 39 patients using DL-generated apertures and the planning algorithm. The predicted apertures had Dice scores of 0.95, 0.94, and 0.90 for posterior-anterior, lateral, and boost fields, respectively; 100%, 95%, and 87.5% of the posterior-anterior, lateral, and boost apertures were scored as clinically acceptable. Wedged and non-wedged plans were clinically acceptable for 85% and 50% of patients, respectively. The final plans' hotspot dose was reduced from 121% (± 14%) to 109% (± 5%) of the prescription dose. The integrated end-to-end workflow of automatically generated apertures and optimized field-in-field planning gave clinically acceptable plans for 38/39 (97%) of patients. We have successfully automated the clinical workflow for generating radiotherapy plans for rectal cancer at our institution.
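The iterative hotspot/subfield loop described in this abstract can be caricatured on a 1-D dose profile. This is a deliberately crude sketch, not the clinical algorithm: dose values, the hotspot tolerance, and the one-shot weight choice are all invented, and a real subfield shapes fluence through the MLC rather than subtracting dose directly.

```python
import numpy as np

# Toy 1-D dose profile (percent of prescription) from the open field.
open_field = np.array([100., 105., 121., 118., 104., 100.])
TARGET_MAX = 110.0  # hypothetical hotspot tolerance

dose = open_field.copy()
subfields = []
for _ in range(5):                      # iterate until hotspot controlled
    hotspot = dose > TARGET_MAX
    if not hotspot.any():
        break
    # Subfield "fluence": shield the hotspot region only.
    subfield = np.where(hotspot, -1.0, 0.0)
    # Pick the weight that brings the current peak down to the target.
    weight = dose.max() - TARGET_MAX
    dose = dose + weight * subfield
    subfields.append(weight)

print(dose.max() <= TARGET_MAX, len(subfields))  # True 1
```

Even this toy version shows the structure of the reported result: each pass finds the hottest region, adds a shielded subfield, and reweights, driving the peak from 121% toward the ~109% the paper achieves.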


Four armed robot carries out major surgery in Britain for the first time

Daily Mail - Science & tech

A patient has become the first in Britain to be operated on for major cancer surgery by a robot. Dean Walter, 41, had a full pelvic extraction in which his bladder, prostate, rectum and lower colon were removed through a cut in his abdomen only 2in wide. The operation usually requires a surgeon and three assistants to cut the patient open from their chest to their groin. Mr Walter, a former fitness model, would have needed three weeks in hospital to recover from traditional surgery – but he was ready to go home eight days after the robotic procedure for rectal cancer. It was done using a £2million Da Vinci Xi robot, which has four arms for cutting tissue, sealing blood vessels and filming inside the body with a 3D camera.


Robots are no better at performing surgery than humans

Daily Mail - Science & tech

Highly-trained surgeons armed with a scalpel perform procedures faster than machines, at a lower cost - and do not make more mistakes. Two studies - one by Leeds University and the other by Stanford University in the US - last night independently found robots did not reduce side effects or improve the patient's health when compared to manual operations. But they both found that robotic surgery took longer and was more expensive. For the last 15 years robots have been increasingly used in the NHS, replacing the surgeon's hand with the arm of a machine. The NHS has about 60 surgical robots, often bought for about £1.5million each by hospitals' charitable trusts after local fundraising campaigns.